A semilocal convergence analysis for directional Newton methods
Abstract

A semilocal convergence analysis for directional Newton methods in n variables is provided in this study. Using weaker hypotheses than in the elegant related work by Y. Levin and A. Ben-Israel, and introducing the center-Lipschitz condition, we provide, at the same computational cost as in Levin and Ben-Israel, a semilocal convergence analysis with the following advantages: weaker convergence co...

Similar resources
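The directional Newton iteration referred to above solves a single equation f(x) = 0 in n variables by stepping along a chosen direction each iteration. The following is a minimal sketch under illustrative assumptions: the test function, the choice d_k = grad f(x_k), and all names below are for illustration and are not taken from the paper.

```python
# Directional Newton iteration (in the spirit of Levin & Ben-Israel):
#     x_{k+1} = x_k - f(x_k) / (grad f(x_k) . d_k) * d_k
# for a single scalar equation f(x) = 0 in n variables.

def directional_newton(f, grad_f, x0, tol=1e-10, max_iter=50):
    x = list(x0)
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            break
        g = grad_f(x)
        # With d_k = grad f(x_k), the denominator grad f . d reduces to
        # ||g||^2; the semilocal analysis requires this quantity to stay
        # bounded away from zero near the solution.
        denom = sum(gi * gi for gi in g)
        step = fx / denom
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Example: f(x, y) = x^2 + y^2 - 4 vanishes on the circle of radius 2;
# starting at (1, 1) the iterates converge to (sqrt(2), sqrt(2)).
f = lambda v: v[0] ** 2 + v[1] ** 2 - 4.0
grad_f = lambda v: [2.0 * v[0], 2.0 * v[1]]
root = directional_newton(f, grad_f, [1.0, 1.0])
```

Since the underdetermined system has a curve of solutions, the direction choice determines which root is found; the gradient direction yields the root nearest the starting point along the radial line.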
On Semilocal Convergence of Inexact Newton Methods
Inexact Newton methods are constructed by combining Newton’s method with another iterative method that is used to solve the Newton equations inexactly. In this paper, we establish two semilocal convergence theorems for the inexact Newton methods. When these two theorems are specified to Newton’s method, we obtain a different Newton-Kantorovich theorem about Newton’s method. When the iterative m...
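The combination described above can be sketched as follows: the Newton equation F'(x) s = -F(x) is solved only approximately, stopping the inner linear solver once the residual satisfies ||F(x) + F'(x) s|| <= eta * ||F(x)||. The 2x2 test system and the Jacobi inner solver are illustrative assumptions, not taken from the paper.

```python
# Inexact Newton: outer Newton iteration with an approximate inner
# linear solve controlled by a forcing term eta.

def norm(v):
    return sum(vi * vi for vi in v) ** 0.5

def mat_vec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

def jacobi_solve(A, b, eta_norm, max_inner=200):
    """Jacobi iteration for A s = b, stopped once ||b - A s|| <= eta_norm."""
    s = [0.0] * len(b)
    for _ in range(max_inner):
        r = [bi - avi for bi, avi in zip(b, mat_vec(A, s))]
        if norm(r) <= eta_norm:
            break
        s = [(b[i] - sum(A[i][j] * s[j] for j in range(len(s)) if j != i))
             / A[i][i] for i in range(len(s))]
    return s

def inexact_newton(F, J, x0, eta=0.1, tol=1e-10, max_outer=50):
    x = list(x0)
    for _ in range(max_outer):
        Fx = F(x)
        if norm(Fx) < tol:
            break
        # Forcing condition: inner residual at most eta * ||F(x)||.
        s = jacobi_solve(J(x), [-fi for fi in Fx], eta * norm(Fx))
        x = [xi + si for xi, si in zip(x, s)]
    return x

# Example system with root (1, 1) and a diagonally dominant Jacobian,
# so the Jacobi inner iteration converges.
F = lambda v: [4 * v[0] + v[1] + v[0] ** 2 - 6,
               v[0] + 3 * v[1] + v[1] ** 2 - 5]
J = lambda v: [[4 + 2 * v[0], 1.0], [1.0, 3 + 2 * v[1]]]
root = inexact_newton(F, J, [2.0, 2.0])
```

A smaller forcing term eta gives faster outer convergence at the price of more inner iterations; semilocal theorems of the kind summarized above quantify this trade-off.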
A semilocal convergence analysis of an inexact Newton method using recurrent relations
We extend the applicability of an inexact Newton method in order to approximate a locally unique solution of a nonlinear equation in a Banach space setting. The recurrent relations method is used to prove the existence-convergence theorem. Our error bounds are tighter, and the information on the location of the solution is at least as precise, under the same information as before. Our results compar...
A Unifying Framework for Convergence Analysis of Approximate Newton Methods
Many machine learning models are reformulated as optimization problems, so it is important to solve large-scale optimization problems in big data applications. Recently, subsampled Newton methods have emerged and attracted much attention due to their per-iteration efficiency, rectifying a weakness of the ordinary Newton method, namely its high cost at each iteration, whil...
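The subsampling idea mentioned above can be sketched as follows: the gradient uses all n data points, but the Hessian is estimated from a random subsample, reducing the per-iteration cost. The least-squares model and all names below are illustrative assumptions, not the framework of the cited paper.

```python
import random

# Subsampled Newton sketch for least squares: minimize
#     (1/2n) * sum_i (x_i . w - y_i)^2
# with a full gradient but a Hessian averaged over a random subsample.

random.seed(0)

# Synthetic data: y_i = x_i . w_true with w_true = (2, -1).
w_true = [2.0, -1.0]
data = []
for _ in range(200):
    x = [random.uniform(-1, 1), random.uniform(-1, 1)]
    data.append((x, x[0] * w_true[0] + x[1] * w_true[1]))

def full_gradient(w):
    # Exact gradient of the objective, using all n data points.
    n = len(data)
    g = [0.0, 0.0]
    for x, y in data:
        r = x[0] * w[0] + x[1] * w[1] - y
        g[0] += r * x[0] / n
        g[1] += r * x[1] / n
    return g

def subsampled_hessian(sample):
    # Average of x x^T over the subsample only (the cost saving).
    m = len(sample)
    h = [[0.0, 0.0], [0.0, 0.0]]
    for x, _ in sample:
        h[0][0] += x[0] * x[0] / m
        h[0][1] += x[0] * x[1] / m
        h[1][1] += x[1] * x[1] / m
    h[1][0] = h[0][1]
    return h

def subsampled_newton(w, sample_size=50, steps=30):
    for _ in range(steps):
        g = full_gradient(w)
        h = subsampled_hessian(random.sample(data, sample_size))
        # Solve the 2x2 Newton system h * d = g by Cramer's rule.
        det = h[0][0] * h[1][1] - h[0][1] * h[1][0]
        d0 = (g[0] * h[1][1] - g[1] * h[0][1]) / det
        d1 = (h[0][0] * g[1] - h[1][0] * g[0]) / det
        w = [w[0] - d0, w[1] - d1]
    return w

w_hat = subsampled_newton([0.0, 0.0])
```

Because the subsampled Hessian only approximates the true one, each step contracts the error by a random factor rather than converging quadratically; convergence analyses of the kind the abstract describes bound this factor in terms of the sample size.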
Journal

Journal title: Mathematics of Computation
Year: 2010
ISSN: 0025-5718,1088-6842
DOI: 10.1090/s0025-5718-2010-02398-1